path query
Computationally and statistically efficient learning of causal Bayes nets using path queries

Bello, Kevin, Honorio, Jean

Neural Information Processing Systems

Causal discovery from empirical data is a fundamental problem in many scientific domains. Observational data allows for identifiability only up to the Markov equivalence class. In this paper we first propose a polynomial-time algorithm for learning, with high probability, the exact correctly-oriented structure of the transitive reduction of any causal Bayesian network, by using interventional path queries. Each path query takes as input an origin node and a target node, and answers whether there is a directed path from the origin to the target. This is done by intervening on the origin node and observing samples from the target node. We prove a logarithmic sample complexity for the amount of interventional data required per path query, for both continuous and discrete networks. We then show how to learn the transitive edges also using a logarithmic number of samples (albeit in time exponential in the maximum number of parents for discrete networks), which allows us to learn the full network. We further extend our work by reducing the number of interventional path queries for learning rooted trees. We also provide an analysis of imperfect interventions.
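The intervene-and-observe idea behind a path query can be sketched on a toy linear-Gaussian network. This is a minimal illustration, not the paper's algorithm: the network (X1 -> X2 -> X3, with X4 disconnected), the intervention values, and the decision threshold are all invented for the example. A directed path from the origin exists exactly when changing the intervened value shifts the target's distribution.

```python
import random
import statistics

# Toy linear-Gaussian network: X1 -> X2 -> X3, with X4 disconnected.
# An intervention do(X1 = v) fixes X1's value; a downstream node's
# distribution shifts iff a directed path from X1 reaches it.

def sample(do_x1, n=2000, seed=0):
    rng = random.Random(seed)
    out = []
    for _ in range(n):
        x1 = do_x1                          # intervened node
        x2 = 0.8 * x1 + rng.gauss(0, 1)     # child of X1
        x3 = 0.8 * x2 + rng.gauss(0, 1)     # grandchild of X1
        x4 = rng.gauss(0, 1)                # no path from X1
        out.append((x1, x2, x3, x4))
    return out

def path_query(target_idx, v0=0.0, v1=5.0, threshold=1.0):
    # Answer "is there a directed path from X1 to the target?" by
    # comparing the target's mean under two interventions on X1.
    m0 = statistics.mean(s[target_idx] for s in sample(v0, seed=1))
    m1 = statistics.mean(s[target_idx] for s in sample(v1, seed=2))
    return abs(m1 - m0) > threshold

print(path_query(2))  # X3 is downstream of X1 -> True
print(path_query(3))  # X4 has no path from X1 -> False
```

In the paper's setting this mean-shift test is replaced by carefully thresholded statistics that yield the stated logarithmic sample complexity; the sketch only conveys the query's input/output behavior.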




Reviews: Computationally and statistically efficient learning of causal Bayes nets using path queries

Neural Information Processing Systems

This paper gives algorithms for recovering the structure of causal Bayesian networks. The main focus is on using path queries, that is, queries asking whether a directed path exists between two nodes. Unlike with descendant queries, with path queries one can only hope to recover the transitive structure (an equivalence class of graphs). The main contribution here is to show that at least this can be done in polynomial time, while each query relies on interventions that require only a logarithmic number of samples. The authors do this for discrete and sub-Gaussian random variables, show how the result can be patched up to recover the actual graph, and suggest specializations (rooted trees) and extensions (imperfect interventions).


Pathformer: Recursive Path Query Encoding for Complex Logical Query Answering

Zhang, Chongzhi, Peng, Zhiping, Zheng, Junhao, Wang, Linghao, Shi, Ruifeng, Ma, Qianli

arXiv.org Artificial Intelligence

Complex Logical Query Answering (CLQA) over incomplete knowledge graphs is a challenging task. Recently, Query Embedding (QE) methods have been proposed to solve CLQA by performing multi-hop logical reasoning. However, most of them only consider historical query context information while ignoring future information, so they fail to capture the complex dependencies among the elements of a query. In recent years, the transformer architecture has shown a strong ability to model long-range dependencies between words. The bidirectional attention mechanism proposed by the transformer can overcome the limitation of these QE methods regarding query context. Still, as a sequence model, the transformer has difficulty directly modeling complex logical queries whose computation graphs contain branches. To this end, we propose a neural one-point embedding method called Pathformer based on the tree-like computation graph, i.e., the query computation tree. Specifically, Pathformer decomposes the query computation tree into path query sequences by branches and then uses the transformer encoder to recursively encode these path query sequences to obtain the final query embedding. This allows Pathformer to fully utilize future context information to explicitly model the complex interactions between various parts of the path query. Experimental results show that Pathformer outperforms existing competitive neural QE methods, and we found that Pathformer has the potential to be applied to non-one-point embedding spaces.
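The branch decomposition described above can be illustrated with a small sketch: a query computation tree is split into root-to-leaf path sequences, each of which a sequence encoder can then process. The node labels and tree layout here are invented for illustration and are not Pathformer's actual data structures.

```python
# Hypothetical query computation tree: each node is (operation, children).
# Decomposing by branches yields one path sequence per root-to-leaf path,
# which a transformer encoder could then consume one sequence at a time.

def decompose(node, prefix=()):
    """Return all root-to-leaf paths of the computation tree."""
    op, children = node
    path = prefix + (op,)
    if not children:                 # leaf: one finished path
        return [path]
    paths = []
    for child in children:           # branch: one path set per child
        paths.extend(decompose(child, path))
    return paths

# Query: intersection of two projection branches (illustrative labels).
tree = ("intersection", [
    ("project:directedBy", [("anchor:Nolan", [])]),
    ("project:starring", [("anchor:Bale", [])]),
])

for p in decompose(tree):
    print(" -> ".join(p))
```

Running this prints two path sequences, one per branch of the intersection, which is the shape of input the recursive encoding step would operate on.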


Joint Semantics and Data-Driven Path Representation for Knowledge Graph Inference

Niu, Guanglin, Li, Bo, Zhang, Yongfei, Sheng, Yongpan, Shi, Chuan, Li, Jingyang, Pu, Shiliang

arXiv.org Artificial Intelligence

Inference on a large-scale knowledge graph (KG) is of great importance for KG applications such as question answering. Path-based reasoning models can leverage rich information along paths beyond pure triples in the KG, but they face several challenges: all existing path-based methods are data-driven and lack explainability in their path representations. Besides, some methods either consider only relational paths or ignore the heterogeneity between the entities and relations contained in paths, and thus cannot capture the rich semantics of paths well. To address these challenges, in this work we propose a novel joint semantics- and data-driven path representation that balances explainability and generalization in the framework of KG embedding. More specifically, we inject Horn rules to obtain condensed paths through a transparent and explainable path composition procedure. An entity converter is designed to transform the entities along paths into representations at the semantic level, similar to relations, reducing the heterogeneity between entities and relations; KGs both with and without type information are considered. Our proposed model is evaluated on two classes of tasks: link prediction and path query answering. The experimental results show that it achieves a significant performance gain over several state-of-the-art baselines.
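The rule-injection step described above can be sketched as rewriting a relational path with a Horn rule whose body matches a sub-path. Both the rule (bornIn, cityOf) => nationality and the example path are made up for illustration; the paper's actual composition procedure operates on learned embeddings.

```python
# Condense a relational path by replacing any sub-path that matches a
# Horn rule body with the rule's head. Rule and relations are invented.

RULES = {("bornIn", "cityOf"): "nationality"}

def condense(path, rules=RULES):
    """Greedily replace length-2 rule bodies with their head relation."""
    out = []
    i = 0
    while i < len(path):
        pair = tuple(path[i:i + 2])
        if pair in rules:
            out.append(rules[pair])  # matched body -> emit head
            i += 2
        else:
            out.append(path[i])      # no match -> keep relation as-is
            i += 1
    return out

print(condense(["bornIn", "cityOf", "capitalOf"]))
# ['nationality', 'capitalOf']
```

The condensed path is shorter and carries an interpretable composite relation, which is the explainability benefit the abstract refers to.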


Answering Complex Queries in Knowledge Graphs with Bidirectional Sequence Encoders

Kotnis, Bhushan, Lawrence, Carolin, Niepert, Mathias

arXiv.org Artificial Intelligence

Representation learning for knowledge graphs (KGs) has focused on the problem of answering simple link prediction queries. In this work we address the more ambitious challenge of predicting the answers of conjunctive queries with multiple missing entities. We propose Bi-Directional Query Embedding (\textsc{BiQE}), a method that embeds conjunctive queries with models based on bidirectional attention mechanisms. Contrary to prior work, bidirectional self-attention can capture interactions among all the elements of a query graph. We introduce a new dataset for predicting the answers of conjunctive queries and conduct experiments showing that \textsc{BiQE} significantly outperforms state-of-the-art baselines.


Comparing Graph Databases

#artificialintelligence

A relational database has a ledger-style structure. It can be queried through SQL, and it is what most people are familiar with. Each entry is a row in a table. Tables are related by foreign-key constraints, which link a row in one table to the primary key of a row in another. Querying a relational database often involves slow multi-level joins.
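The foreign-key join described above can be shown with a minimal in-memory SQLite example via Python's standard library; the table and column names are invented for illustration.

```python
import sqlite3

# Two tables related by a foreign key: papers.author_id references
# authors.id. The join walks that key to connect rows across tables.
con = sqlite3.connect(":memory:")
con.executescript("""
    CREATE TABLE authors (id INTEGER PRIMARY KEY, name TEXT);
    CREATE TABLE papers  (id INTEGER PRIMARY KEY, title TEXT,
                          author_id INTEGER REFERENCES authors(id));
    INSERT INTO authors VALUES (1, 'Ada'), (2, 'Grace');
    INSERT INTO papers  VALUES (10, 'Path Queries', 1);
""")

rows = con.execute("""
    SELECT papers.title, authors.name
    FROM papers JOIN authors ON papers.author_id = authors.id
""").fetchall()
print(rows)  # [('Path Queries', 'Ada')]
```

Each additional hop in a query (e.g. papers to authors to affiliations) adds another join like this one, which is the multi-level join cost that graph databases aim to avoid.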